Efficiency for Regularization Parameter Selection in Penalized Likelihood Estimation of Misspecified Models
Authors
Abstract
It has been shown that AIC-type criteria are asymptotically efficient selectors of the tuning parameter in non-concave penalized regression methods under the assumption that the population variance is known or that a consistent estimator is available. We relax this assumption to prove that AIC itself is asymptotically efficient, and we study its performance in finite samples. In classical regression, it is known that AIC tends to select overly complex models when the dimension of the maximum candidate model is large relative to the sample size. Simulation studies suggest that AIC suffers from the same shortcomings when used in penalized regression. We therefore propose the use of the classical corrected AIC (AICc) as an alternative and prove that it maintains the desired asymptotic properties. To broaden our results, we further prove the efficiency of AIC for penalized likelihood methods in the context of generalized linear models with no dispersion parameter. Similar results exist in the literature, but only for a restricted set of candidate models. By employing results from the classical literature on maximum likelihood estimation in misspecified models, we are able to establish this result for a general set of candidate models. We use simulations to assess the performance of AIC and AICc, as well as that of other selectors, in finite samples for both SCAD-penalized and Lasso regressions, and a real data example is considered.

arXiv:1302.2068v1 [stat.ML] 8 Feb 2013
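To make the selection procedure concrete, the following is a minimal sketch of AIC/AICc tuning-parameter selection for a Lasso-type fit. It assumes an orthonormal design (so the Lasso solution reduces to soft-thresholding of the OLS coefficients), estimates the error variance by RSS/n as in the unknown-variance setting above, and counts degrees of freedom as the number of nonzero coefficients plus one for the variance; all function names are illustrative, not from the paper.

```python
import numpy as np

def soft_threshold(z, lam):
    # Closed-form Lasso solution when X'X = I: soft-threshold the OLS coefficients.
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def aic_aicc_path(X, y, lambdas):
    """For each candidate lambda, fit the Lasso (orthonormal X assumed) and
    compute AIC and AICc based on the Gaussian log-likelihood with the error
    variance estimated by RSS/n (no known-variance assumption)."""
    n = len(y)
    beta_ols = X.T @ y  # OLS coefficients under orthonormality
    results = []
    for lam in lambdas:
        beta = soft_threshold(beta_ols, lam)
        rss = np.sum((y - X @ beta) ** 2)
        df = np.count_nonzero(beta) + 1  # nonzero coefficients + variance parameter
        aic = n * np.log(rss / n) + 2 * df
        # Classical small-sample correction; the penalty grows as df approaches n.
        aicc = aic + 2 * df * (df + 1) / (n - df - 1)
        results.append((lam, aic, aicc))
    return results
```

One would then pick the lambda minimizing AIC (or AICc); since the AICc correction term is positive whenever n > df + 1, AICc penalizes large models more heavily than AIC, which is the behavior motivating its use when the candidate dimension is large relative to n.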
Similar articles
Penalized Bregman Divergence Estimation via Coordinate Descent
Variable selection via penalized estimation is appealing for dimension reduction. For penalized linear regression, Efron et al. (2004) introduced the LARS algorithm. Recently, the coordinate descent (CD) algorithm was developed by Friedman et al. (2007) for penalized linear regression and penalized logistic regression and was shown to gain computational superiority. This paper explores...
Regularization Parameter Selection in the Group Lasso
This article discusses the problem of choosing a regularization parameter in the group Lasso proposed by Yuan and Lin (2006), an l1-regularization approach for producing a block-wise sparse model that has attracted a lot of interest in statistics, machine learning, and data mining. It is important to choose an appropriate regularization parameter from a set of candidate values, because it...
Estimation and model selection of semiparametric multivariate survival functions under general censorship.
We study estimation and model selection of semiparametric models of multivariate survival functions for censored data, which are characterized by possibly misspecified parametric copulas and nonparametric marginal survivals. We obtain the consistency and root-n asymptotic normality of a two-step copula estimator for the pseudo-true copula parameter value defined by the KLIC, and provide a simple c...
Some Problems in Model Selection
This dissertation consists of three parts: the first two parts are related to smoothing spline ANOVA models; the third part concerns the Lasso and its related procedures in model selection. In Part I, by adopting the Cox proportional hazards model to quantify the hazard function, we propose a novel nonparametric model selection technique to analyze time-to-event data, within the framework of smo...
Large-scale Inversion of Magnetic Data Using Golub-Kahan Bidiagonalization with Truncated Generalized Cross Validation for Regularization Parameter Estimation
In this paper, a fast method for large-scale sparse inversion of magnetic data is considered. The L1-norm stabilizer is used to generate models with sharp and distinct interfaces. To deal with the non-linearity introduced by the L1-norm, a model-space iteratively reweighted least squares algorithm is used. The original model matrix is factorized using the Golub-Kahan bidiagonalization that proje...